Lecture 11: Shannon vs. Hamming
Authors
Abstract
In the last lecture, we proved the positive part of Shannon's capacity theorem for the BSC. We showed by the probabilistic method that there exist an encoding function E and a decoding function D such that

    E_m [ Pr_{noise e of BSC_p} [ D(E(m) + e) ≠ m ] ] ≤ 2^{-δ′n}.    (1)

In other words, the average decoding error probability is small. However, we need to show that the maximum decoding error probability over all messages is small. In the last lecture, we quickly went over how (1) implies the latter. We will start today's lecture by going over this argument again. As was mentioned in the last lecture, the trick is to throw away all the messages that have high error probability. In particular, we only keep the messages with error probability at most the median error probability.
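The "throw away bad messages" step above (often called expurgation) can be sketched numerically. The following is a minimal illustration, not code from the lecture notes: given made-up per-message error probabilities whose average is small, we keep only the messages whose error probability is at most the median. By Markov's inequality the median is at most twice the average, so every surviving message has small error probability, and halving the number of messages costs only one message bit (a rate loss of 1/n).

```python
import random

def expurgate(error_probs):
    """Keep only the messages whose error probability is at most the median."""
    median = sorted(error_probs)[len(error_probs) // 2]
    return [p for p in error_probs if p <= median], median

# Toy example: hypothetical per-message decoding error probabilities
# of a random code with small AVERAGE error.
probs = [random.random() * 2 ** -10 for _ in range(1024)]
kept, median = expurgate(probs)
avg = sum(probs) / len(probs)

# At least half the messages survive expurgation.
assert len(kept) >= len(probs) // 2
# Every survivor's error probability is at most the median,
# which by Markov's inequality is at most twice the average.
assert max(kept) <= 2 * avg
```

The point of the sketch is the bookkeeping: the maximum error probability over the kept messages is bounded by twice the original average, so (1) yields a code with small maximum error at essentially the same rate.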
Similar resources
Lecture 11: Shannon vs. Hamming (September 21, 2007)
Lecture 1 (February 6, 2013)
The two papers are closely interrelated, but they take different perspectives on what an error is and how to correct it. In the Shannon model, the errors are drawn from a probability distribution, whereas they are chosen adversarially in the Hamming model. In the rest of this lecture, we will introduce the Hamming model. First, we give an example with a concrete setting of parameters before giving the...
Representations of Genetic Tables, Bimagic Squares, Hamming Distances and Shannon Entropy
In this paper we have established relations of the genetic tables with magic and bimagic squares. Connections with Hamming distances and binomial coefficients are established. The idea of Gray code is applied. The Shannon entropy of magic squares of order 4 × 4, 8 × 8 and 16 × 16 is also calculated, and some comparisons are made. Symmetry among restriction enzymes having four letters is also studied.
CS 59000-CTT: Current Topics in Theoretical CS
We introduced error-correcting codes and linear codes in the last lecture. In this lecture we will discuss the properties of linear codes in more detail and describe classical examples of linear codes. We will also show the Hamming bound, a bound relating the distance, rate, and block length parameters of a code, which is tight for the Hamming codes. Recall that an (n, k, d)_q co...
The Performance of Block Codes, Volume 49, Number 1
In his classic 1948 paper [10], Claude Shannon introduced the notions of communication channels and codes to communicate over them. During the following two decades, he remained active in refining and extending the theory. One of Shannon's favorite research topics was the fundamental performance capabilities of long block codes. In the 1950s and 1960s this topic also attracted the active invol...